Numerical characterization of support recovery in sparse regression with correlated design
Abstract
Sparse regression is employed in diverse scientific settings as a feature selection method. A pervasive aspect of scientific data is the presence of correlations between predictive features. These correlations hamper both feature selection and estimation, and jeopardize conclusions drawn from estimated models. On the other hand, theoretical results on sparsity-inducing regularized estimators have largely addressed conditions for support recovery consistency via asymptotics and disregard the problem of model selection, whereby regularization parameters are chosen. In this numerical study, we address these issues through an exhaustive characterization of the performance of several sparse regression estimators, coupled with a range of model selection strategies. The estimators and selection criteria were examined across correlated regression problems with varying degrees of signal to noise, distributions of the non-zero coefficients, and degrees of sparsity. Our results reveal a fundamental tradeoff between false positive and false negative control across all estimators and criteria examined. Additionally, we numerically explore a transition point, modulated by the signal-to-noise ratio and the spectral properties of the design covariance matrix, at which the selection accuracy of all considered algorithms degrades. Overall, we find that SCAD coupled with BIC or empirical Bayes model selection performs best among the methods considered.
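As a rough illustration of the kind of experiment described above (not the authors' code), the sketch below simulates a correlated Gaussian design with a sparse coefficient vector and scores support recovery for a sparse estimator tuned by BIC. Scikit-learn's LassoLarsIC is used as a stand-in because SCAD is not available there, and all parameter values (n, p, sparsity, correlation, SNR) are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LassoLarsIC

rng = np.random.default_rng(0)
n, p, k = 200, 50, 5   # samples, features, non-zero coefficients (assumed values)
rho = 0.7              # pairwise correlation of the design (assumed)

# Correlated Gaussian design with equicorrelated covariance (1 - rho) I + rho 11^T
Sigma = (1 - rho) * np.eye(p) + rho * np.ones((p, p))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Sparse coefficient vector with k non-zero entries drawn uniformly
beta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta[support] = rng.uniform(1.0, 3.0, size=k)

# Noise level set from a target signal-to-noise ratio
snr = 5.0
noise_sd = np.sqrt(beta @ Sigma @ beta / snr)
y = X @ beta + noise_sd * rng.normal(size=n)

# Lasso path with BIC model selection (stand-in for SCAD + BIC)
model = LassoLarsIC(criterion="bic").fit(X, y)
est_support = np.flatnonzero(model.coef_)

# False positive / false negative counts for support recovery
false_pos = np.setdiff1d(est_support, support).size
false_neg = np.setdiff1d(support, est_support).size
print(f"false positives: {false_pos}, false negatives: {false_neg}")
```

Sweeping rho, snr, and k in such a simulation is one way to trace the tradeoff between false positive and false negative control that the abstract describes.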
Similar articles
Sparse regression with highly correlated predictors
We consider a linear regression y = Xβ + u where X ∈ R^{n×p}, p ≫ n, and β is s-sparse. Motivated by examples in financial and economic data, we consider the situation where X has highly correlated and clustered columns. To perform sparse recovery in this setting, we introduce the clustering removal algorithm (CRA), which seeks to decrease the correlation in X by removing the cluster structure withou...
Robust Estimation in Linear Regression with Multicollinearity and Sparse Models
One of the factors affecting the statistical analysis of the data is the presence of outliers. The methods which are not affected by the outliers are called robust methods. Robust regression methods are robust estimation methods of regression model parameters in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...
Numerical methods for sparse recovery
These lecture notes are an introduction to methods recently developed for performing numerical optimizations with linear model constraints and additional sparsity conditions to solutions, i.e. we expect solutions which can be represented as sparse vectors with respect to a prescribed basis. Such a type of problems has been recently greatly popularized by the development of the field of nonadapt...
Sparse Support Recovery with Phase-Only Measurements
Sparse support recovery (SSR) is an important part of the compressive sensing (CS). Most of the current SSR methods are with the full information measurements. But in practice the amplitude part of the measurements may be seriously destroyed. The corrupted measurements mismatch the current SSR algorithms, which leads to serious performance degeneration. This paper considers the problem of SSR w...
Sparse Recovery with Partial Support Knowledge
The goal of sparse recovery is to recover the (approximately) best k-sparse approximation x̂ of an n-dimensional vector x from linear measurements Ax of x. We consider a variant of the problem which takes into account partial knowledge about the signal. In particular, we focus on the scenario where, after the measurements are taken, we are given a set S of size s that is supposed to contain most...
Journal
Journal title: Communications in Statistics - Simulation and Computation
Year: 2022
ISSN: 0361-0918, 1532-4141
DOI: https://doi.org/10.1080/03610918.2022.2050392